Search Results
ML Seminars - On the convergence of gradient descent for wide two-layer neural networks
OWOS: Francis Bach - "On the Convergence of Gradient Descent for Wide Two-Layer Neural Networks"
Francis Bach - On the convergence of gradient descent for wide two-layer neural networks
Lénaïc Chizat - Analysis of Gradient Descent on Wide Two-Layer Neural Networks
Lénaïc Chizat - Analysis of Gradient Descent on Wide Two-Layer ReLU Neural Networks
Implicit Bias of Gradient Descent for Wide Two-layer Neural Networks Trained with the Logistic Loss
On the Global Convergence of Gradient Descent for (...) - Bach - Workshop 3 - CEB T1 2019
Rong Ge (Duke): A Local Convergence Theory for Mildly Over-Parameterized Two-Layer Neural Net
Feature Selection with Gradient Descent on Two-layer Networks in Low-rotation Regimes
Lecture 8 part 1: Deep Neural Networks
Analyzing Optimization and Generalization in Deep Learning via Dynamics of Gradient Descent
Andrea Agazzi - Convergence & optimality of single-layer neural networks for reinforcement learning